Fano's Inequality

A lower bound on the probability of transmission error; equivalently, an upper bound on the uncertainty about $X$ that remains after transmission.

Formula

$$ \begin{aligned} &X \to Y \to \hat{X} \\ &\qquad\Downarrow \\ P_e &\geq \frac{H(X|\hat{X}) - 1}{\log |\mathcal{X}|} \geq \frac{H(X|Y) - 1}{\log |\mathcal{X}|} \\ 1+P_e\log|\mathcal{X}| &\geq H(X|\hat{X}) \geq H(X|Y) \\ H(X|Y) &\leq 1+P_e\log|\mathcal{X}| \end{aligned} $$
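
As a concrete illustration of how the bound is used, here is a minimal numeric sketch for a hypothetical 4-symbol uniform source sent through a symmetric noisy channel (the alphabet size, source distribution, and channel are assumptions made up for the example; entropies are in bits, so $\log 2 = 1$):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical setup: uniform X over 4 symbols, symmetric channel that
# delivers the correct symbol with probability 0.8.
n = 4
p_x = np.full(n, 1 / n)
channel = np.full((n, n), 0.2 / (n - 1))   # channel[x, y] = P(Y = y | X = x)
np.fill_diagonal(channel, 0.8)

p_xy = p_x[:, None] * channel              # joint P(X = x, Y = y)
p_y = p_xy.sum(axis=0)

# H(X|Y) = sum_y P(Y = y) * H(X | Y = y)
h_x_given_y = sum(p_y[y] * entropy(p_xy[:, y] / p_y[y]) for y in range(n))

# Fano: P_e >= (H(X|Y) - 1) / log|X| for ANY estimator X_hat built from Y.
fano_bound = (h_x_given_y - 1) / np.log2(n)
print(f"H(X|Y)  = {h_x_given_y:.3f} bits")
print(f"P_e    >= {fano_bound:.3f}")
```

For these assumed numbers the bound is roughly 0.02, far below the MAP decoder's actual error probability of 0.2: Fano's inequality is a bound, not an estimate.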

Proof

Define the error indicator $E$ and error probability $P_e$: $$ \begin{aligned} E &= \begin{cases} 1, &\hat{X} \neq X \\ 0, &\hat{X} = X \end{cases} \\ P_e &= P[E=1] = P[\hat{X} \neq X] \end{aligned} $$

Then, expanding $H(E, X | \hat{X})$ by the chain rule in two orders:

$$ \begin{aligned} H(E, X | \hat{X}) &= H(E|\hat{X}) + H(X|E, \hat{X}) \\ &= H(X|\hat{X}) + H(E|X, \hat{X}) \end{aligned} $$
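
A quick numeric sanity check of this identity, as a sketch under an assumed random joint distribution of $(X, \hat{X})$; $E$ is computed as the indicator $\mathbf{1}\{X \neq \hat{X}\}$ defined above:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
n = 3
p = rng.random((n, n))
p /= p.sum()                                   # assumed joint P(X = x, X_hat = xh)
p_xh = p.sum(axis=0)                           # P(X_hat)

# Lift to the joint of (E, X, X_hat); E = 1{X != X_hat} is deterministic.
p_exh = np.zeros((2, n, n))
for x in range(n):
    for xh in range(n):
        p_exh[int(x != xh), x, xh] = p[x, xh]

def cond_entropy(joint, cond):
    """H(A | B) from joint P(A, B) (last axis = B) and marginal P(B)."""
    return sum(cond[j] * entropy(joint[..., j].ravel() / cond[j])
               for j in range(len(cond)) if cond[j] > 0)

h_ex_given_xh = cond_entropy(p_exh, p_xh)                  # H(E, X | X_hat)
h_x_given_xh  = cond_entropy(p, p_xh)                      # H(X | X_hat)
h_e_given_xh  = cond_entropy(p_exh.sum(axis=1), p_xh)      # H(E | X_hat)

# H(X | E, X_hat): condition on the pair (E, X_hat).
p_e_xh = p_exh.sum(axis=1)                                 # P(E, X_hat)
h_x_given_e_xh = sum(
    p_e_xh[e, j] * entropy(p_exh[e, :, j] / p_e_xh[e, j])
    for e in range(2) for j in range(n) if p_e_xh[e, j] > 0
)

print(h_ex_given_xh)                       # H(E, X | X_hat)
print(h_e_given_xh + h_x_given_e_xh)       # = H(E|X_hat) + H(X|E, X_hat)
print(h_x_given_xh)                        # = H(X|X_hat) + H(E|X, X_hat), last term 0
```

All three printed values agree, which also previews the key fact that $H(E|X, \hat{X}) = 0$.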

Because:

$$ \begin{aligned} \begin{cases} H(E|X, \hat{X}) &= 0 \quad \text{[$E$ is determined by $(X, \hat{X})$]} \\ H(X|E, \hat{X}) &= P_e \cdot H(X|E=1, \hat{X}) + (1 - P_e) \cdot H(X|E=0, \hat{X}) \\ &= P_e \cdot H(X|E=1, \hat{X}) + (1 - P_e) \cdot 0 \quad \text{[$E = 0 \Rightarrow X = \hat{X}$]} \\ &\leq P_e \cdot \log|\mathcal{X}| \quad \text{[$X$ takes at most $|\mathcal{X}|$ values]} \\ H(E | \hat{X}) &\leq H(E) = H_b(P_e) \quad \text{[conditioning reduces entropy]} \end{cases} \end{aligned} $$

Therefore: $$ \begin{aligned} H(X|\hat{X}) &\leq H_b(P_e) + P_e \cdot \log|\mathcal{X}| \\ &\leq \log2 + P_e \cdot \log|\mathcal{X}| &\text{[$H_b(P_e) \leq \log 2$]} \\ &= 1 + P_e \cdot \log|\mathcal{X}| &\text{[logs base 2, so $\log 2 = 1$]} \\ P_e &\geq \frac{H(X|\hat{X}) - 1}{\log|\mathcal{X}|} \\ &\geq \frac{H(X|Y) - 1}{\log|\mathcal{X}|} &\text{[Data Processing Inequality: $H(X|\hat{X}) \geq H(X|Y)$]} \end{aligned} $$
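
As a final sanity check, the sketch below samples random joint distributions of $(X, \hat{X})$ and verifies the derived chain $H(X|\hat{X}) \leq H_b(P_e) + P_e\log|\mathcal{X}| \leq 1 + P_e\log|\mathcal{X}|$ numerically (the alphabet size and number of trials are arbitrary choices for the check):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
n = 5                                            # |X|, arbitrary

for _ in range(10_000):
    p = rng.random((n, n))
    p /= p.sum()                                 # joint P(X = x, X_hat = xh)

    p_e = 1.0 - np.trace(p)                      # P_e = P[X_hat != X]
    h_b = entropy(np.array([p_e, 1.0 - p_e]))    # binary entropy H_b(P_e)

    p_xh = p.sum(axis=0)
    h_x_given_xh = sum(p_xh[j] * entropy(p[:, j] / p_xh[j]) for j in range(n))

    assert h_x_given_xh <= h_b + p_e * np.log2(n) + 1e-9
    assert h_b <= 1.0                            # H_b(P_e) <= log 2 = 1 bit

print("Fano chain held on all sampled joint distributions")
```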

by Jon